
    Multi-objective optimal designs in comparative clinical trials with covariates: The reinforced doubly adaptive biased coin design

    The present paper deals with the problem of allocating patients to two competing treatments in the presence of covariates or prognostic factors, in order to achieve a good trade-off among ethical concerns, inferential precision and randomness in the treatment allocations. In particular, we suggest a multipurpose design methodology that combines efficiency and ethical gain when a linear homoscedastic model with both treatment/covariate interactions and interactions among the covariates is adopted. The ensuing compound optimal allocations of the treatments depend on the covariates and their distribution in the population of interest, as well as on the unknown parameters of the model. Therefore, we introduce the reinforced doubly adaptive biased coin design, namely a general class of covariate-adjusted response-adaptive procedures that includes both continuous and discontinuous randomization functions and is aimed at targeting any desired allocation proportion. The properties of this proposal are described both theoretically and through simulations. Comment: Published at http://dx.doi.org/10.1214/12-AOS1007 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org).
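    The paper defines the reinforced doubly adaptive biased coin design in full generality. As a rough illustration of the doubly adaptive biased coin mechanism it builds on, the sketch below uses Hu and Zhang's allocation function to steer the observed allocation proportion toward a target; here the target rho is held fixed for simplicity, whereas in the covariate-adjusted setting above it would depend on the covariates and on sequentially estimated model parameters. The function name and simulation setup are illustrative assumptions, not the authors' code.

```python
import numpy as np

def dbcd_prob(x, rho, gamma=2.0):
    """Hu-Zhang doubly adaptive biased coin allocation function (a sketch).

    x:     current proportion of patients on treatment A
    rho:   target allocation proportion for A (fixed here for simplicity)
    gamma: nonnegative tuning parameter; larger values push the observed
           proportion toward the target more aggressively
    """
    if x <= 0.0:
        return 1.0          # no one on A yet: assign to A
    if x >= 1.0:
        return 0.0          # everyone on A so far: assign to B
    num = rho * (rho / x) ** gamma
    den = num + (1.0 - rho) * ((1.0 - rho) / (1.0 - x)) ** gamma
    return num / den

# Toy sequential trial: each patient goes to A with the probability
# returned by the allocation function.
rng = np.random.default_rng(0)
n_a, n = 0, 0
for _ in range(500):
    p = dbcd_prob(n_a / n, rho=0.6) if n > 0 else 0.5
    n_a += int(rng.random() < p)
    n += 1
print(f"final proportion on A: {n_a / n:.3f}")  # settles near the 0.6 target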

    On the almost sure convergence of adaptive allocation procedures

    In this paper, we provide some general convergence results for adaptive designs for treatment comparison, both in the absence and in the presence of covariates. In particular, we demonstrate the almost sure convergence of the treatment allocation proportion for a vast class of adaptive procedures, including designs that have not been formally investigated but have mainly been explored through simulations, such as Atkinson's optimum biased coin design, Pocock and Simon's minimization method and some of its generalizations. Although the large majority of proposals in the literature rely on continuous allocation rules, our results make it possible to prove, within a single mathematical framework, the convergence of adaptive allocation methods based on both continuous and discontinuous randomization functions. Several examples from earlier works are included in order to enhance applicability, and our approach provides substantial insight for future proposals, especially in the absence of a prefixed target and for designs characterized by sequences of allocation rules. Comment: Published at http://dx.doi.org/10.3150/13-BEJ591 in Bernoulli (http://isi.cbs.nl/bernoulli/) by the International Statistical Institute/Bernoulli Society (http://isi.cbs.nl/BS/bshome.htm).
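    Among the discontinuous-rule procedures covered by these convergence results is Pocock and Simon's minimization method. The following is a minimal sketch of the marginal-balance idea, assuming the range of the marginal counts as the imbalance measure and a fixed biased-coin probability; the original method also allows other imbalance measures and factor weights, and nothing here is taken from the paper's own formalism.

```python
import numpy as np

def pocock_simon_assign(counts, profile, p_best=0.8, rng=None):
    """One step of Pocock-Simon-style marginal minimization (a sketch).

    counts[f][level] is a two-element list with the number of patients at
    that level of factor f already on each arm; `profile` gives the new
    patient's level on each factor. The arm that would minimize the total
    marginal imbalance is chosen with probability p_best.
    """
    rng = rng or np.random.default_rng()
    imbalance = []
    for arm in (0, 1):
        total = 0
        for f, level in enumerate(profile):
            c = counts[f][level].copy()
            c[arm] += 1                    # hypothetical assignment
            total += abs(c[0] - c[1])      # range as the imbalance measure
        imbalance.append(total)
    if imbalance[0] == imbalance[1]:
        return int(rng.integers(2))        # tie: pure randomization
    best = int(np.argmin(imbalance))
    return best if rng.random() < p_best else 1 - best

# Usage: two binary prognostic factors, 200 sequentially arriving patients.
counts = [{0: [0, 0], 1: [0, 0]} for _ in range(2)]
rng = np.random.default_rng(1)
for _ in range(200):
    profile = (int(rng.integers(2)), int(rng.integers(2)))
    arm = pocock_simon_assign(counts, profile, rng=rng)
    for f, level in enumerate(profile):
        counts[f][level][arm] += 1
print(counts)  # marginal counts stay close to balanced within each level
```

    Note that the randomization function here is discontinuous in the imbalance (it jumps at ties), which is exactly the kind of rule the paper's unified framework accommodates.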

    Simulated annealing for balancing covariates

    Covariate balance is one of the fundamental issues in designing experiments for treatment comparisons, especially in randomized clinical trials. In this article, we introduce a new class of covariate-adaptive procedures based on the Simulated Annealing algorithm, aimed at balancing the allocations of two competing treatments across a set of pre-specified covariates. Owing to the stochastic nature of simulated annealing, these designs are intrinsically randomized, and thus completely unpredictable, as well as very flexible: they can handle both quantitative and qualitative factors and can be implemented in a static version as well as sequentially. The properties of the suggested proposal are described, showing a significant improvement in terms of covariate balance and inferential accuracy with respect to all the other procedures proposed in the literature. An illustrative example based on real data is also discussed.
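    As a rough sketch of the static version of such a procedure, the code below anneals a ±1 assignment vector w to minimize a quadratic imbalance measure ||X'w||^2 over standardized quantitative covariates. The imbalance measure, cooling schedule and move set are our illustrative assumptions; the article's designs also cover qualitative factors and sequential implementation.

```python
import numpy as np

def anneal_balance(X, n_iter=20_000, t0=1.0, cooling=0.9995, rng=None):
    """Static simulated-annealing allocation (a sketch).

    X is an (n, k) matrix of standardized covariates. We search over
    ±1 assignment vectors w for one minimizing ||X'w||^2, flipping one
    random unit per step and accepting worse moves with the usual
    Metropolis probability exp(-(increase in cost) / temperature).
    """
    rng = rng or np.random.default_rng()
    n = X.shape[0]
    w = rng.choice([-1.0, 1.0], size=n)
    margin = X.T @ w                       # current imbalance vector X'w
    cost = float(np.sum(margin ** 2))
    t = t0
    for _ in range(n_iter):
        i = rng.integers(n)
        new_margin = margin - 2.0 * w[i] * X[i]   # effect of flipping unit i
        new_cost = float(np.sum(new_margin ** 2))
        if new_cost < cost or rng.random() < np.exp((cost - new_cost) / t):
            w[i] = -w[i]
            margin, cost = new_margin, new_cost
        t *= cooling                       # geometric cooling
    return w, cost

rng = np.random.default_rng(2)
X = rng.standard_normal((100, 3))
w, cost = anneal_balance(X, rng=rng)
print(f"residual imbalance: {cost:.4f}")   # near zero across the 3 covariates
```

    Appending a column of ones to X would additionally penalize unequal group sizes; the randomness of the accept/reject steps is what makes the final allocation unpredictable even though it is near-optimally balanced.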

    New insights into adaptive enrichment designs

    The transition towards personalized medicine is under way, and this new experimental framework raises several challenges from clinical, ethical, logistical, regulatory, and statistical perspectives. To face these challenges, innovative study designs of increasing complexity have been proposed. In particular, adaptive enrichment designs are becoming more attractive for their flexibility. However, these procedures rely on an increasing number of parameters that are unknown at the planning stage of the clinical trial, so the study design requires particular care. This review is dedicated to adaptive enrichment studies, with a focus on design aspects. While many papers deal with methods for the analysis, sample size determination and the optimal allocation problem have been largely overlooked. We discuss the multiple aspects of adaptive enrichment designs that contribute to their advantages and disadvantages. The decision of whether or not it is worth enriching should be driven by clinical and ethical considerations as well as by scientific and statistical concerns.

    A simple solution to the inadequacy of asymptotic likelihood-based inference for response-adaptive clinical trials

    The present paper discusses drawbacks and limitations of likelihood-based inference in sequential clinical trials for treatment comparisons managed via Response-Adaptive Randomization. Taking into account the most common statistical models for the primary outcome (namely binary, Poisson, exponential and normal data), we derive the conditions under which (i) the classical confidence intervals degenerate and (ii) the Wald test becomes inconsistent and strongly affected by the nuisance parameters, also displaying non-monotonic power. To overcome these drawbacks, we provide a very simple solution that can preserve the fundamental properties of likelihood-based inference. Several illustrative examples and simulation studies are presented in order to confirm the relevance of our results and to provide some practical recommendations.
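    To make the setting concrete, the sketch below simulates a binary-outcome trial under a simple, hypothetical response-adaptive rule and computes the Wald z-statistic for the treatment difference. It is a toy illustration of the kind of Monte Carlo study described above, not the authors' design, their derived conditions, or their proposed correction.

```python
import numpy as np

def rar_trial_wald(p_a, p_b, n=200, rng=None):
    """Simulate one binary-outcome trial under a simple response-adaptive
    rule (after a short burn-in, favor the arm with the higher observed
    success rate) and return the Wald z-statistic for H0: p_a = p_b."""
    rng = rng or np.random.default_rng()
    succ, tot = np.zeros(2), np.zeros(2)
    for i in range(n):
        if tot.min() < 5:
            arm = i % 2                    # burn-in: forced alternation
        else:
            est = succ / tot
            arm = int(np.argmax(est)) if rng.random() < 0.9 else int(np.argmin(est))
        succ[arm] += rng.random() < (p_a, p_b)[arm]
        tot[arm] += 1
    pa_hat, pb_hat = succ / tot
    se2 = pa_hat * (1 - pa_hat) / tot[0] + pb_hat * (1 - pb_hat) / tot[1]
    return (pa_hat - pb_hat) / np.sqrt(se2) if se2 > 0 else np.nan

# Empirical power of the nominal-5% two-sided Wald test over 2,000 trials
rng = np.random.default_rng(3)
z = np.array([rar_trial_wald(0.7, 0.5, rng=rng) for _ in range(2000)])
print(f"empirical power: {np.mean(np.abs(z[~np.isnan(z)]) > 1.96):.3f}")
```

    When the adaptive rule starves one arm of patients, the variance estimate in the denominator becomes unstable, which gives an intuition for (though does not prove) the degenerate intervals and erratic power behavior discussed in the abstract.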

    Reporting only relative effect measures was potentially misleading: some good practices for improving the soundness of epidemiological results

    Objective: In the medical and epidemiological literature there is a growing tendency to report an excessive number of decimal digits (often three, sometimes four), especially when measures of relative occurrence are small; this can be misleading. Study Design and Setting: We combined mathematical and statistical reasoning about the precision of relative risks with the meaning of the decimal part of the same measures from biological and public health perspectives. Results: We identified a general rule for minimizing the mathematical error due to rounding of relative risks, which depends on the background absolute rate and justifies the use of one or more decimal digits for estimates close to 1. Conclusions: We suggest that both relative and absolute risk measures (the latter expressed as rates) should be reported, and that two decimal digits should be used for relative risks close to 1 only if the background rate is at least 1/1,000 person-years (py). The use of more than two decimal digits is justified only when the background rate is high (i.e., on the order of 1/10 py).
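    A back-of-the-envelope calculation (ours, not the authors' exact rule) shows why the background rate matters: the worst-case error in the absolute rate implied by rounding a relative risk scales with that rate.

```python
# Worst-case absolute-rate error caused by rounding a relative risk (RR)
# to a given number of decimals, for several background rates.
# Illustrative only; the paper derives a general rule along these lines.
for r0 in (1 / 10, 1 / 1_000, 1 / 100_000):     # background rate per person-year
    for digits in (1, 2):
        rr_err = 0.5 * 10 ** (-digits)           # max rounding error of the RR
        print(f"rate {r0:.5f}/py, {digits} decimal(s): "
              f"absolute-rate error up to {rr_err * r0:.1e}/py")
```

    With a background rate of 1/100,000 py, the second decimal of the relative risk changes the implied absolute rate by at most 5e-8 events per person-year, which is why extra digits convey essentially no information there.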